The 10-Minute Estimate That Kills Bad Ideas
The Trap
A Product Manager comes to you with an idea: "We should build a recommendation engine for the 'Help' page to reduce support tickets."
- The Senior DS: Says "Okay," pulls the data, spends two weeks building a prototype, and finds out it only reduces tickets by 0.1%.
- The Staff DS: Does "Fermi Sizing" in 10 minutes and kills the project before opening a Jupyter notebook.
The Concept
Named after physicist Enrico Fermi, Fermi sizing is the art of using rough, order-of-magnitude estimates to size a problem with zero data. The goal is not Precision (being exactly right); the goal is Magnitude (knowing if the problem is even worth solving).
Part 1: The Universal Decomposition
Every product idea, no matter how complex, breaks down into the same three variables. I write this on the whiteboard for every new request:
- Population (The Ceiling): What is the maximum set of users who could possibly trigger this? (e.g., All users who visit the Help Page).
- Reach (The Funnel): Of those, how many will actually see/interact with your feature? (e.g., Only 50% scroll down far enough).
- Intensity (The Delta): If the model works perfectly, how much do we move the needle? (e.g., We deflect 10% of tickets).
The Rule of 10x:
If you multiply these out and the result isn't 10x the cost of building it, kill it. Do not argue over 5% vs 7%. If it's not an order of magnitude, it's noise.
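The decomposition and the Rule of 10x fit in a few lines of arithmetic. A minimal sketch for the Help-page recommendation idea, using made-up illustrative numbers (ticket volume, deflection rate, costs are all hypothetical):

```python
def fermi_size(population, reach, intensity, value_per_unit):
    """Order-of-magnitude impact: Population x Reach x Intensity x $/unit."""
    return population * reach * intensity * value_per_unit

# Hypothetical Help-page recommendation engine:
monthly_tickets = 20_000   # Population: tickets that start on the Help page
reach = 0.5                # Reach: only half of filers see the widget
deflection = 0.10          # Intensity: best-case share of tickets deflected
cost_per_ticket = 15       # $ saved per deflected ticket

monthly_impact = fermi_size(monthly_tickets, reach, deflection,
                            cost_per_ticket)

build_cost = 50_000  # rough cost of two engineer-months (assumption)
worth_doing = monthly_impact * 12 >= 10 * build_cost  # Rule of 10x

print(f"${monthly_impact:,.0f}/month -> 10x test: {worth_doing}")
```

With these inputs the best case is $15,000/month ($180k/year) against a 10x bar of $500k, so the idea dies before anyone opens a notebook.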
Part 2: The "Staff Constants" (Memorize These)
You cannot do Fermi Sizing in a meeting if you have to say "Let me check the dashboard." You lose the room.
To operate at the Staff level, you must memorize the "Physics of your Business."
- The Volumes: Daily Active Users (DAU), Monthly Active Users (MAU), Daily Transactions.
- The Rates: Average Conversion Rate, Average Churn Rate, Click-Through Rate (CTR) on main surfaces.
- The Costs: Cost per Ticket (Support), Cost per Query (Compute), Revenue per User (ARPU).
💡 Field Note: At DoorDash, I knew the average "Dasher Wait Time" and "Order Volume" by heart. When someone proposed a feature to "optimize wait time by 1%," I could instantly calculate the dollar value without opening a laptop.
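Memorized constants are what make the in-meeting math possible. A sketch of how they turn a "1% improvement" pitch into a dollar figure on the spot (every number here is hypothetical, including the per-order cost of wait time):

```python
# Hypothetical "Staff Constants" — memorize your company's real values.
CONSTANTS = {
    "dau": 2_000_000,           # Daily Active Users
    "orders_per_day": 500_000,  # Daily transactions
    "cost_per_ticket": 12.0,    # $ per support ticket
    "arpu_monthly": 8.0,        # Revenue per user per month
}

def dollar_value_of(annual_volume, unit_value, relative_lift):
    """Instant mental math: volume x $/unit x % improvement."""
    return annual_volume * unit_value * relative_lift

# "Optimize wait time by 1%" — what is that worth per year?
# Assume (hypothetically) wait time costs ~$0.10 per order in churn/refunds.
annual = dollar_value_of(CONSTANTS["orders_per_day"] * 365, 0.10, 0.01)
print(f"~${annual:,.0f}/year")
```

The point is not the specific figures: it is that with the volumes and unit costs in your head, the multiplication takes seconds, not a dashboard session.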
Part 3: The "Soft Kill" Checklist
You will generate many ideas. A healthy team should be killing 10-20% of them immediately based on Fermi sizing.
Use this checklist to "Soft Kill" bad ideas without being a blocker:
- Don't say: "This idea is bad."
- Do say: "I love the intent, but the math is tight. Even if we get 100% adoption, the max impact is only $5k/month. Is there a bigger lever we can pull?"
- Don't say: "I need to analyze the data first."
- Do say: "Let's assume the best-case scenario. If 100% of users click this, does it hit our OKR? No? Then let's skip the analysis."
Case Study: The "Data Quality" Trap
The Request:
The team wanted to build an ML model to filter out "bad quality" data (estimated at 5% of volume) from a training set.
The Fermi Sizing:
- Input: 5% bad data.
- Cost: 2 Engineers for 1 month to build the filter.
- Benefit: Marginal model improvement (maybe 0.5% AUC).
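The sizing above can be written out explicitly. The engineer-month rate and the dollar value of an AUC point are assumptions for illustration — the hard (and telling) part of this exercise is that nobody could name a credible dollar value for the benefit at all:

```python
# Cost side: 2 engineers for 1 month (hypothetical fully-loaded rate).
engineer_month = 20_000               # $ per engineer-month (assumption)
build_cost = 2 * 1 * engineer_month   # = $40,000

# Benefit side: ~0.5 points of AUC. Translating AUC into dollars is the
# hard part — assume (generously) each AUC point is worth $10k/year.
auc_lift_points = 0.5
value_per_auc_point = 10_000          # $/year (assumption)
annual_benefit = auc_lift_points * value_per_auc_point  # = $5,000/year

# Rule of 10x: benefit must be ~10x cost to justify building.
print(annual_benefit >= 10 * build_cost)  # False -> kill it
```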
The Staff Pivot (First Principles):
Instead of asking "How do we filter it?", I asked: "Why are we paying to collect bad data?"
We were spending resources to ingest data, only to spend more resources to delete it. That is negative leverage.
The Fix:
We killed the ML project and sent one engineer to update the data collection guidelines. We stopped "cleaning the river" and started "filtering the pipe."